The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act

E-paper

"The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act" by researcher Christina Dinar, focuses on the challenges faced by the queer community in Europe and offers detailed recommendations for the forthcoming EU Digital Services Act.


This paper is one of two complementary publications addressing the cultural, technical, and structural problems in current moderation practices—particularly the double standards that platforms apply to those from marginalized communities. Taken together, the papers offer a transatlantic perspective on this important global issue.

"Algorithmic misogynoir in content moderation practice" by researcher Brandeis Marshall, offers an intersectional analysis of the experiences of Black American women on social media platforms


Download the e-paper "The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act" by Christina Dinar

Open, public, and rational discourse is often considered the heart of democracy, and social media platforms have provided key infrastructure for exactly this purpose. Social media, freed from the constraints of traditional media gatekeepers, facilitates content created and shared by users themselves. It has enabled social movements and political change, especially for those suffering from structural exclusion and suppression. With the help of social media, movements like #MeToo, #BlackLivesMatter, and #FridaysForFuture achieved unprecedented global impact.

However, not all user-generated content amplifies movements for equity and gender democracy. Social media is a space “of feminist activism [but] at the same time [a space] of surveillance and punishment for feminist activism and activity”. It is also a place for populist right-wing content promoting authoritarian and minority-threatening ideologies. This type of content is often perceived as “free speech,” even when it harms democracy and destabilizes democratic institutions, as in the case of the storming of the United States Capitol in January 2021. Able to bypass journalists and reporting standards, right-wing counterculture flourishes on “alternative” platforms, where it can quickly spread misinformation and violent ideas.

In this environment, social media companies frequently cast themselves merely as hosting services that enable free speech, yet they are not neutral platforms. In practice, they moderate and curate the content that users see, often in poor, discriminatory, or opaque ways that rely on simplistic technical solutions.

Thoughtful content moderation is crucial for building a healthy, safe, and inclusive internet, especially as the number of social media users grows. Today, 46% of the population in Eastern Europe uses these platforms, and that figure rises to 67% in Northern Europe. Yet content moderation has not always received the attention it deserves. The European Commission’s draft legislation on regulating high-risk AI, released in April 2021, does not cover content moderation, even though many content moderation systems rely on artificial intelligence.

This report will show how current content moderation policies are filled with inconsistencies and double standards that often hurt marginalized communities. The report will then address models of content moderation and current EU regulatory approaches, focusing especially on the aims and obstacles of the proposed Digital Services Act. It will provide recommendations on how to build content moderation practices that protect marginalized communities. Ultimately, the report argues that a stronger democratization of content moderation is necessary to build infrastructure that helps gender democracy become a key part of internet culture.